Search Results: 4 resources
Deep learning has shown incredible potential across a wide array of tasks, and this growth has been accompanied by an insatiable appetite for data. However, much of the data needed for deep learning is stored on personal devices, and recent privacy concerns have further highlighted the challenges of accessing such data. As a result, federated learning (FL) has emerged as an important privacy-preserving technology that enables collaborative training of machine learning models without sending the raw, potentially sensitive, data to a central server. However, the fundamental premise that sending model updates to a server is privacy-preserving holds only if the updates cannot be "reverse engineered" to infer information about the private training data. It has been shown under a wide variety of settings that this privacy premise does not hold. In this article we provide a comprehensive literature review of the different privacy attacks and defense methods in FL. We identify the current limitations of these attacks and highlight the settings in which the privacy of an FL client can be broken. We further dissect some of the successful industry applications of FL and draw lessons for future successful adoption. We survey the emerging landscape of privacy regulation for FL and conclude with future directions for taking FL toward the cherished goal of generating accurate models while preserving the privacy of the data from its participants.
Free, publicly accessible full text available September 30, 2026.
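The collaborative-training setup the abstract describes can be illustrated with a minimal sketch of federated averaging (FedAvg): each client runs local SGD on its private data and sends only the resulting model update to the server, which averages the updates into a new global model. The scalar linear model, learning rate, and toy datasets below are illustrative assumptions, not part of the surveyed work; note that the very updates exchanged here are what privacy attacks attempt to "reverse engineer."

```python
# Minimal FedAvg sketch, assuming a scalar linear model y = w * x
# trained with plain SGD on each client. Raw (x, y) data never
# leaves a client; only the locally updated weight is shared.

def client_update(w, data, lr=0.01, epochs=5):
    """Run local SGD on private (x, y) pairs; return the new weight."""
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x  # d/dw of squared error (w*x - y)^2
            w -= lr * grad
    return w

def fedavg(w, client_datasets, rounds=10):
    """Each round: broadcast w, collect local updates, average them."""
    for _ in range(rounds):
        updates = [client_update(w, d) for d in client_datasets]
        w = sum(updates) / len(updates)
    return w

# Two clients whose private data both follow y = 3x.
clients = [[(1.0, 3.0), (2.0, 6.0)], [(0.5, 1.5), (4.0, 12.0)]]
w = fedavg(0.0, clients)
print(round(w, 2))  # converges toward 3.0
```

Even though no raw data is transmitted, the per-round updates are a function of each client's private examples, which is exactly the leakage surface that gradient-inversion and membership-inference attacks exploit.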
-
Li, Yiming; Sun, Jingwei; Liu, Yudong; Zhang, Yuandong; Li, Ang; Chen, Beidi; Roth, Holger R; Xu, Daguang; Chen, Tingjun; Chen, Yiran (ACM).
Free, publicly accessible full text available December 4, 2025.
-
Sun, Jingwei; Xu, Ziyue; Yang, Dong; Nath, Vishwesh; Li, Wenqi; Zhao, Can; Xu, Daguang; Chen, Yiran; Roth, Holger R (IEEE).
-
Xu, An; Li, Wenqi; Guo, Pengfei; Yang, Dong; Roth, Holger; Hatamizadeh, Ali; Zhao, Can; Xu, Daguang; Huang, Heng; Xu, Ziyue (2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)).